Regaining sparsity in kernel principal components

Authors

  • César Ignacio García-Osorio
  • Colin Fyfe
Abstract

Support Vector Machines are supervised regression and classification machines which have the nice property of automatically identifying which of the data points are most important in creating the machine. Kernel Principal Component Analysis (KPCA) is a related technique in that it also relies on linear operations in a feature space but does not have this ability to identify important points. Sparse KPCA goes too far in that it identifies a single data point as most important. We show how, by bagging the data, we may create a compromise which gives us a sparse but not grandmother representation for KPCA. © 2005 Elsevier B.V. All rights reserved.
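The compromise described above can be illustrated with a minimal NumPy sketch: standard KPCA expands each component over *all* training points, whereas fitting KPCA on a small bootstrap bag restricts the expansion to the points in the bag. This is only an illustration of the bagging idea, not the authors' exact algorithm; the names `bagged_sparse_kpca`, `bag_size`, and `gamma` are assumptions, and centering of the test kernel is omitted for brevity.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian kernel between rows of X and rows of Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    # Standard KPCA: every training point receives a nonzero
    # expansion coefficient, so the representation is dense.
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    Kc = J @ K @ J                            # centre Gram matrix in feature space
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]  # leading eigenpairs
    # Scale eigenvectors so projections have unit-variance components
    return V[:, idx] / np.sqrt(np.maximum(w[idx], 1e-12))

def bagged_sparse_kpca(X, n_components=2, gamma=1.0, bag_size=20, seed=0):
    # Fit KPCA on one bootstrap bag: the kernel expansion then
    # involves at most bag_size points instead of all of X,
    # i.e. sparse but not a single "grandmother" point.
    rng = np.random.default_rng(seed)
    bag = rng.choice(len(X), size=bag_size, replace=True)
    Xb = X[bag]
    alphas = kernel_pca(Xb, n_components, gamma)
    def project(Xnew):
        # test-kernel centering omitted in this sketch
        return rbf_kernel(Xnew, Xb, gamma) @ alphas
    return project, bag

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
project, bag = bagged_sparse_kpca(X, n_components=2, bag_size=20)
Z = project(X)
print(Z.shape)                 # (100, 2)
print(np.unique(bag).size)     # at most 20 distinct support points
```

Averaging projections over many bags would recover stability while keeping each expansion short; the single-bag version above only shows where the sparsity comes from.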


Related articles

Sparsity and Error Analysis of Empirical Feature-Based Regularization Schemes

We consider a learning algorithm generated by a regularization scheme with a concave regularizer for the purpose of achieving sparsity and good learning rates in a least squares regression setting. The regularization is induced for linear combinations of empirical features, constructed in the literature of kernel principal component analysis and kernel projection machines, based on kernels and...


Sparsity in machine learning: theory and practice

The thesis explores sparse machine learning algorithms for supervised (classification and regression) and unsupervised (subspace methods) learning. For classification, we review the set covering machine (SCM) and propose new algorithms that directly minimise the SCM's sample compression generalisation error bounds during the training phase. Two of the resulting algorithms are proved to pr...


Predicting the Young's Modulus and Uniaxial Compressive Strength of a typical limestone using the Principal Component Regression and Particle Swarm Optimization

In geotechnical engineering, rock mechanics and engineering geology, depending on the project design, the uniaxial strength and static Young's modulus of rocks are of vital importance. The direct determination of the aforementioned parameters in the laboratory, however, requires intact and high-quality cores, and the preparation of their specimens has some limitations. Moreover, performing thes...


Robust Sparse Principal Component Analysis

A method for principal component analysis is proposed that is sparse and robust at the same time. The sparsity delivers principal components that have loadings on a small number of variables, making them easier to interpret. The robustness makes the analysis resistant to outlying observations. The principal components correspond to directions that maximize a robust measure of the variance, with...


Notes on PCA, Regularization, Sparsity and Support Vector Machines

We derive a new representation for a function as a linear combination of local correlation kernels at optimal sparse locations and discuss its relation to PCA, regularization, sparsity principles and Support Vector Machines. We also discuss its Bayesian interpretation and justification. We first review previous results for the approximation of a function from discrete data (Girosi, 1998) in the co...



Journal:
  • Neurocomputing

Volume 67, Issue –

Pages –

Published 2005